Console Output
Training and evaluating model for: Washing Machine
Dataset length: 22402 windows
NILMModel(
(conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
(lstm): LSTM(9, 256, num_layers=4, batch_first=True, dropout=0.1)
(dropout): Dropout(p=0.1, inplace=False)
(relu): ReLU()
(output_layer): Linear(in_features=256, out_features=1, bias=True)
)
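The printed repr above fully specifies the layers but not how they are wired together. Below is a minimal sketch reconstructing the architecture; the ordering inside `forward()` (conv, then ReLU, then LSTM, then dropout, then the linear head) is an assumption inferred from the module names, since the original forward pass is not shown in the log.

```python
# Hedged reconstruction of the NILMModel printed in the log.
# Layer hyperparameters match the repr exactly; forward() ordering is assumed.
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    def __init__(self, in_channels=9, hidden_size=256, num_layers=4):
        super().__init__()
        # Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
        self.conv1d = nn.Conv1d(in_channels, in_channels,
                                kernel_size=3, stride=1, padding=1)
        # LSTM(9, 256, num_layers=4, batch_first=True, dropout=0.1)
        self.lstm = nn.LSTM(in_channels, hidden_size, num_layers=num_layers,
                            batch_first=True, dropout=0.1)
        self.dropout = nn.Dropout(p=0.1)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, seq_len, channels); Conv1d wants (batch, channels, seq_len)
        x = self.relu(self.conv1d(x.transpose(1, 2))).transpose(1, 2)
        out, _ = self.lstm(x)          # (batch, seq_len, hidden_size)
        out = self.dropout(out)
        return self.output_layer(out)  # one appliance-power value per timestep

model = NILMModel()
y = model(torch.randn(2, 60, 9))  # 2 windows, 60 timesteps, 9 input features
print(y.shape)  # -> torch.Size([2, 60, 1])
```

The kernel-size-3, padding-1 convolution preserves the window length, so the model emits a per-timestep power estimate for the target appliance.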
Epoch [1/300], Train Loss: 0.002906
Validation Loss: 0.002972
Epoch [2/300], Train Loss: 0.002370
Validation Loss: 0.002488
Epoch [3/300], Train Loss: 0.002224
Validation Loss: 0.002336
Epoch [4/300], Train Loss: 0.002167
Validation Loss: 0.002205
Epoch [5/300], Train Loss: 0.002056
Validation Loss: 0.002076
Epoch [6/300], Train Loss: 0.001994
Validation Loss: 0.002288
Epoch [7/300], Train Loss: 0.001960
Validation Loss: 0.001895
Epoch [8/300], Train Loss: 0.001826
Validation Loss: 0.001922
Epoch [9/300], Train Loss: 0.001784
Validation Loss: 0.001769
Epoch [10/300], Train Loss: 0.001730
Validation Loss: 0.001891
Epoch [11/300], Train Loss: 0.001672
Validation Loss: 0.001919
Epoch [12/300], Train Loss: 0.001929
Validation Loss: 0.003337
Epoch [13/300], Train Loss: 0.002266
Validation Loss: 0.001811
Epoch [14/300], Train Loss: 0.001886
Validation Loss: 0.001794
Epoch [15/300], Train Loss: 0.001712
Validation Loss: 0.001757
Epoch [16/300], Train Loss: 0.001710
Validation Loss: 0.001819
Epoch [17/300], Train Loss: 0.001629
Validation Loss: 0.001689
Epoch [18/300], Train Loss: 0.001631
Validation Loss: 0.001588
Epoch [19/300], Train Loss: 0.001599
Validation Loss: 0.001572
Epoch [20/300], Train Loss: 0.001571
Validation Loss: 0.001544
Epoch [21/300], Train Loss: 0.001554
Validation Loss: 0.001864
Epoch [22/300], Train Loss: 0.001488
Validation Loss: 0.001445
Epoch [23/300], Train Loss: 0.001426
Validation Loss: 0.001476
Epoch [24/300], Train Loss: 0.001420
Validation Loss: 0.001390
Epoch [25/300], Train Loss: 0.001392
Validation Loss: 0.001342
Epoch [26/300], Train Loss: 0.001337
Validation Loss: 0.001223
Epoch [27/300], Train Loss: 0.001250
Validation Loss: 0.001143
Epoch [28/300], Train Loss: 0.001326
Validation Loss: 0.001245
Epoch [29/300], Train Loss: 0.001282
Validation Loss: 0.001387
Epoch [30/300], Train Loss: 0.001299
Validation Loss: 0.001172
Epoch [31/300], Train Loss: 0.001239
Validation Loss: 0.001116
Epoch [32/300], Train Loss: 0.001146
Validation Loss: 0.001017
Epoch [33/300], Train Loss: 0.001067
Validation Loss: 0.001342
Epoch [34/300], Train Loss: 0.001148
Validation Loss: 0.000898
Epoch [35/300], Train Loss: 0.001098
Validation Loss: 0.001066
Epoch [36/300], Train Loss: 0.001110
Validation Loss: 0.001503
Epoch [37/300], Train Loss: 0.001211
Validation Loss: 0.001218
Epoch [38/300], Train Loss: 0.001121
Validation Loss: 0.000991
Epoch [39/300], Train Loss: 0.001039
Validation Loss: 0.000955
Epoch [40/300], Train Loss: 0.001161
Validation Loss: 0.000963
Epoch [41/300], Train Loss: 0.000934
Validation Loss: 0.000844
Epoch [42/300], Train Loss: 0.000910
Validation Loss: 0.000914
Epoch [43/300], Train Loss: 0.000910
Validation Loss: 0.000933
Epoch [44/300], Train Loss: 0.000965
Validation Loss: 0.000847
Epoch [45/300], Train Loss: 0.000867
Validation Loss: 0.001093
Epoch [46/300], Train Loss: 0.000989
Validation Loss: 0.000729
Epoch [47/300], Train Loss: 0.000816
Validation Loss: 0.000698
Epoch [48/300], Train Loss: 0.000788
Validation Loss: 0.000716
Epoch [49/300], Train Loss: 0.000757
Validation Loss: 0.000632
Epoch [50/300], Train Loss: 0.000772
Validation Loss: 0.001085
Epoch [51/300], Train Loss: 0.000774
Validation Loss: 0.000762
Epoch [52/300], Train Loss: 0.000708
Validation Loss: 0.000607
Epoch [53/300], Train Loss: 0.000682
Validation Loss: 0.000593
Epoch [54/300], Train Loss: 0.000744
Validation Loss: 0.000604
Epoch [55/300], Train Loss: 0.000709
Validation Loss: 0.000557
Epoch [56/300], Train Loss: 0.000640
Validation Loss: 0.000636
Epoch [57/300], Train Loss: 0.000726
Validation Loss: 0.000626
Epoch [58/300], Train Loss: 0.000658
Validation Loss: 0.000551
Epoch [59/300], Train Loss: 0.000732
Validation Loss: 0.000604
Epoch [60/300], Train Loss: 0.000675
Validation Loss: 0.000508
Epoch [61/300], Train Loss: 0.000630
Validation Loss: 0.000883
Epoch [62/300], Train Loss: 0.000627
Validation Loss: 0.000735
Epoch [63/300], Train Loss: 0.000561
Validation Loss: 0.000719
Epoch [64/300], Train Loss: 0.000667
Validation Loss: 0.000726
Epoch [65/300], Train Loss: 0.000616
Validation Loss: 0.000555
Epoch [66/300], Train Loss: 0.000540
Validation Loss: 0.000778
Epoch [67/300], Train Loss: 0.000600
Validation Loss: 0.000466
Epoch [68/300], Train Loss: 0.000498
Validation Loss: 0.000533
Epoch [69/300], Train Loss: 0.000483
Validation Loss: 0.000607
Epoch [70/300], Train Loss: 0.000792
Validation Loss: 0.000939
Epoch [71/300], Train Loss: 0.001238
Validation Loss: 0.000647
Epoch [72/300], Train Loss: 0.000650
Validation Loss: 0.000813
Epoch [73/300], Train Loss: 0.000574
Validation Loss: 0.000538
Epoch [74/300], Train Loss: 0.000559
Validation Loss: 0.000470
Epoch [75/300], Train Loss: 0.000523
Validation Loss: 0.000486
Epoch [76/300], Train Loss: 0.000440
Validation Loss: 0.000426
Epoch [77/300], Train Loss: 0.000458
Validation Loss: 0.000419
Epoch [78/300], Train Loss: 0.000473
Validation Loss: 0.000410
Epoch [79/300], Train Loss: 0.000452
Validation Loss: 0.000386
Epoch [80/300], Train Loss: 0.000450
Validation Loss: 0.000428
Epoch [81/300], Train Loss: 0.000442
Validation Loss: 0.000627
Epoch [82/300], Train Loss: 0.000410
Validation Loss: 0.000372
Epoch [83/300], Train Loss: 0.000411
Validation Loss: 0.000376
Epoch [84/300], Train Loss: 0.000399
Validation Loss: 0.000393
Epoch [85/300], Train Loss: 0.000415
Validation Loss: 0.000374
Epoch [86/300], Train Loss: 0.000478
Validation Loss: 0.000390
Epoch [87/300], Train Loss: 0.000462
Validation Loss: 0.000489
Epoch [88/300], Train Loss: 0.000379
Validation Loss: 0.000409
Epoch [89/300], Train Loss: 0.000464
Validation Loss: 0.000422
Epoch [90/300], Train Loss: 0.000448
Validation Loss: 0.000373
Epoch [91/300], Train Loss: 0.000380
Validation Loss: 0.000352
Epoch [92/300], Train Loss: 0.000346
Validation Loss: 0.000350
Epoch [93/300], Train Loss: 0.000373
Validation Loss: 0.000358
Epoch [94/300], Train Loss: 0.000337
Validation Loss: 0.000358
Epoch [95/300], Train Loss: 0.000359
Validation Loss: 0.000514
Epoch [96/300], Train Loss: 0.000389
Validation Loss: 0.000370
Epoch [97/300], Train Loss: 0.000323
Validation Loss: 0.000533
Epoch [98/300], Train Loss: 0.000462
Validation Loss: 0.000356
Epoch [99/300], Train Loss: 0.000332
Validation Loss: 0.000326
Epoch [100/300], Train Loss: 0.000340
Validation Loss: 0.000450
Epoch [101/300], Train Loss: 0.000411
Validation Loss: 0.000340
Epoch [102/300], Train Loss: 0.000311
Validation Loss: 0.000386
Epoch [103/300], Train Loss: 0.000430
Validation Loss: 0.000771
Epoch [104/300], Train Loss: 0.000419
Validation Loss: 0.000397
Epoch [105/300], Train Loss: 0.000327
Validation Loss: 0.000355
Epoch [106/300], Train Loss: 0.000322
Validation Loss: 0.001134
Epoch [107/300], Train Loss: 0.001866
Validation Loss: 0.001238
Epoch [108/300], Train Loss: 0.000745
Validation Loss: 0.000514
Epoch [109/300], Train Loss: 0.000368
Validation Loss: 0.000340
Early stopping triggered
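The stopping criterion itself is not printed, but the trace is consistent with patience-based early stopping on the validation loss: the best value (0.000326 at epoch 99) is followed by ten epochs without improvement before training halts at epoch 109. A sketch under that assumption (the actual patience value and any weight-restoring behaviour are guesses):

```python
# Hedged sketch of patience-based early stopping consistent with the log.
# patience=10 is inferred from epochs 99 -> 109; not confirmed by the source.
class EarlyStopping:
    def __init__(self, patience=10):
        self.patience = patience
        self.best = float("inf")
        self.counter = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True to stop training."""
        if val_loss < self.best:
            self.best = val_loss
            self.counter = 0          # improvement resets the patience counter
        else:
            self.counter += 1         # count consecutive non-improving epochs
        return self.counter >= self.patience

# Tiny demo with patience=3: stops after three epochs without improvement.
stopper = EarlyStopping(patience=3)
stops = [stopper.step(v) for v in [1.0, 0.9, 0.95, 0.96, 0.97]]
print(stops)  # -> [False, False, False, False, True]
```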
Evaluating model for: Washing Machine
Validation MAE: 10.017307 W
Validation MSE: 2343.155029 W²
Validation RMSE: 48.406147 W
Signal Aggregate Error (SAE): 0.027555
Normalized Disaggregation Error (NDE): 0.265927
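The five metrics above can be computed from the predicted and ground-truth appliance power series. MAE, MSE, and RMSE are standard; SAE and NDE below use the definitions common in the NILM literature (relative error of total energy, and squared error normalized by the true signal's energy), which may differ in detail from the original evaluation code.

```python
# Hedged sketch of the reported evaluation metrics (standard NILM definitions assumed).
import math

def nilm_metrics(y_true, y_pred):
    n = len(y_true)
    mae = sum(abs(p - t) for p, t in zip(y_pred, y_true)) / n
    mse = sum((p - t) ** 2 for p, t in zip(y_pred, y_true)) / n
    rmse = math.sqrt(mse)
    # Signal Aggregate Error: relative error of total predicted energy.
    sae = abs(sum(y_pred) - sum(y_true)) / sum(y_true)
    # Normalized Disaggregation Error: squared error over true signal energy.
    nde = mse * n / sum(t ** 2 for t in y_true)
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "SAE": sae, "NDE": nde}

# Toy example: two timesteps, predictions off by 10 W each in opposite directions.
m = nilm_metrics([100.0, 200.0], [110.0, 190.0])
print(m)  # MAE 10.0, MSE 100.0, RMSE 10.0, SAE 0.0 (totals cancel), NDE 0.004
```

Note how SAE can be near zero (total energy matches, as in the 0.0276 above) even when per-timestep errors remain, which is why it is reported alongside RMSE and NDE.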
[Figure: Training and Validation Loss (interactive plot not reproduced here)]